Dimension Reduction in Regressions with Exponential Family Predictors
Authors

Abstract
We present the first methodology for dimension reduction in regressions with predictors that, given the response, follow one-parameter exponential families. Our approach is based on modeling the conditional distribution of the predictors given the response, which allows us to derive and estimate a sufficient reduction of the predictors. We also propose a method of estimating the forward regression mean function without requiring an explicit forward regression model. Whereas nearly all existing estimators of the central subspace are limited to regressions with continuous predictors only, our methodology extends estimation to regressions with all-categorical predictors or a mixture of categorical and continuous predictors. Supplementary materials, including the proofs and the computer code, are available from the JCGS website.

∗R. Dennis Cook is Professor, School of Statistics, University of Minnesota, 313 Ford Hall, 224 Church Street SE, Minneapolis, MN 55455; email: [email protected]. Lexin Li is Assistant Professor, Department of Statistics, North Carolina State University, Raleigh, NC 27695; email: [email protected]. Research for this article was supported in part by National Science Foundation Grants DMS-0405360 (awarded to RDC) and DMS-0706919 (awarded to LL).
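The paper's own estimator models the predictors given the response within one-parameter exponential families; as context, the general inverse-regression idea it builds on can be illustrated with a minimal sliced inverse regression (SIR) sketch. This is a generic SIR implementation, not the authors' algorithm, and the function name and slicing scheme are illustrative assumptions:

```python
import numpy as np

def sir(X, y, n_slices=5, d=1):
    """Minimal sliced inverse regression sketch (Li, 1991 style).

    Estimates d basis directions of the central subspace by
    eigen-decomposing the between-slice covariance of the
    standardized predictors.
    """
    n, p = X.shape
    # Whiten the predictors: Z has identity covariance.
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(Sigma)
    Sinv_half = np.linalg.inv(L).T
    Z = (X - mu) @ Sinv_half
    # Slice the observations on the ordered response and
    # accumulate the weighted outer products of slice means.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors, mapped back to the original X scale.
    vals, vecs = np.linalg.eigh(M)
    return Sinv_half @ vecs[:, ::-1][:, :d]
```

Like most such estimators, this sketch assumes continuous predictors; extending estimation to categorical or mixed predictors is precisely the gap the paper addresses.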
Similar Resources
Estimating Sufficient Reductions of the Predictors in Abundant High-dimensional Regressions, by R. Dennis Cook and Liliana Forzani
We study the asymptotic behavior of a class of methods for sufficient dimension reduction in high-dimensional regressions, as the sample size and the number of predictors grow in various alignments. It is demonstrated that these methods are consistent in a variety of settings, particularly in abundant regressions where most predictors contribute some information on the response, and oracle rates are ...
Sufficient dimension reduction and prediction in regression.
Dimension reduction for regression is a prominent issue today because technological advances now allow scientists to routinely formulate regressions in which the number of predictors is considerably larger than in the past. While several methods have been proposed to deal with such regressions, principal components (PCs) still seem to be the most widely used across the applied sciences. We give...
On Dimension Reduction in Regressions with Multivariate Responses
This paper is concerned with dimension reduction in regressions with multivariate responses on high-dimensional predictors. A unified method that can be regarded as either an inverse regression approach or a forward regression method is proposed to recover the central dimension reduction subspace. By using Stein’s Lemma, the forward regression estimates the first derivative of the conditional c...
Sufficient dimension reduction in regressions across heterogeneous subpopulations
Sliced inverse regression is one of the most widely used dimension reduction methods. Chiaromonte and co-workers extended this method to regressions with qualitative predictors, developing partial sliced inverse regression under the assumption that the covariance matrices of the continuous predictors are constant across the levels of the qualitative predictor. We extend partial sliced i...
Sufficient Dimension Reduction via Inverse Regression: A Minimum Discrepancy Approach
A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression estimator (IRE), is proposed, along with inference methods and a computational algorithm. The IRE has at least three desirable properties: (1) Its estimated basis of the central dimension reduction subspa...